Importance Sampling using Rényi divergence

Author

  • Emanuel Florentin Olariu
Abstract

We present an alternative approach to estimating the probabilities of rare events and to solving optimization problems using the class of Rényi divergences of order α > 1. The general procedure we describe does not involve any specific family of distributions; the only restriction is that the search space consists of product-form probability density functions. We discuss an algorithm for estimating rare-event probabilities and a version for continuous optimization. The results of the numerical experiments with these algorithms, reported in the last section, support their performance.
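To make the setting concrete, the sketch below shows adaptive importance sampling for a rare-event probability with a product-form Gaussian proposal. It is a minimal illustration under assumed choices (performance function, dimension, threshold, elite quantile are all hypothetical), and the parameter update shown is the standard weighted moment matching of the KL-based cross-entropy method; the paper's contribution is to replace that criterion with one based on a Rényi divergence of order α > 1.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical rare event: S(x) = x_1 + ... + x_d >= gamma under X ~ N(0, I_d).
d, gamma, n = 10, 20.0, 100_000

def log_density(x, mu, sigma):
    """Log-density of a product-form Gaussian with componentwise mean mu and std sigma."""
    return -0.5 * np.sum(((x - mu) / sigma) ** 2 + np.log(2 * np.pi * sigma ** 2), axis=1)

mu0, sig0 = np.zeros(d), np.ones(d)   # nominal (target) parameters
mu, sig = mu0.copy(), sig0.copy()     # proposal parameters, adapted below

# Adaptive loop: raise an elite threshold toward gamma and refit the proposal.
for _ in range(100):
    x = rng.normal(mu, sig, size=(n, d))
    s = x.sum(axis=1)                          # performance function S(x)
    level = min(gamma, np.quantile(s, 0.9))    # elite threshold (top 10%)
    elite = s >= level
    w = np.exp(log_density(x[elite], mu0, sig0) - log_density(x[elite], mu, sig))
    # Weighted moment matching: the closed-form KL (cross-entropy) update.
    # The paper replaces this KL-based criterion with a Renyi divergence of
    # order alpha > 1; this update is only a generic stand-in.
    mu = np.average(x[elite], axis=0, weights=w)
    sig = np.sqrt(np.average((x[elite] - mu) ** 2, axis=0, weights=w))
    if level >= gamma:
        break

# Final importance-sampling estimate of P(S(X) >= gamma).
x = rng.normal(mu, sig, size=(n, d))
s = x.sum(axis=1)
w = np.exp(log_density(x, mu0, sig0) - log_density(x, mu, sig))
print(f"estimated probability: {np.mean(w * (s >= gamma)):.3e}")
```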

Similar Articles

Bhattacharyya error and divergence using variational importance sampling

Many applications require the use of divergence measures between probability distributions. Several of these, such as the Kullback-Leibler (KL) divergence and the Bhattacharyya divergence, are tractable for single Gaussians, but intractable for complex distributions such as Gaussian mixture models (GMMs) used in speech recognizers. For tasks related to classification error, the Bhattacharyya di...
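The intractability is easy to see: the Bhattacharyya divergence D_B(p, q) = -ln ∫ sqrt(p(x) q(x)) dx has no closed form when p and q are Gaussian mixtures. The sketch below is not the variational importance-sampling method of the cited work; it is a plain Monte Carlo estimate using the equal-weight mixture of p and q as the proposal, with hypothetical one-dimensional mixture parameters.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

def gmm_pdf(x, weights, means, stds):
    """Density of a 1-D Gaussian mixture evaluated at the points x."""
    comp = norm.pdf(x[:, None], loc=means, scale=stds)   # shape (n, k)
    return comp @ weights

def gmm_sample(n, weights, means, stds):
    """Draw n samples from a 1-D Gaussian mixture."""
    idx = rng.choice(len(weights), size=n, p=weights)
    return rng.normal(means[idx], stds[idx])

# Two illustrative (hypothetical) mixtures p and q.
wp, mp, sp = np.array([0.4, 0.6]), np.array([-1.0, 2.0]), np.array([0.8, 1.2])
wq, mq, sq = np.array([0.5, 0.5]), np.array([0.0, 3.0]), np.array([1.0, 0.7])

# Importance sampling with proposal r = (p + q) / 2:
# D_B(p, q) = -ln (1/n) * sum_i sqrt(p(x_i) q(x_i)) / r(x_i),  x_i ~ r.
n = 200_000
half = n // 2
x = np.concatenate([gmm_sample(half, wp, mp, sp), gmm_sample(n - half, wq, mq, sq)])
p, q = gmm_pdf(x, wp, mp, sp), gmm_pdf(x, wq, mq, sq)
r = 0.5 * (p + q)
d_b = -np.log(np.mean(np.sqrt(p * q) / r))
print(f"Bhattacharyya divergence estimate: {d_b:.4f}")
```

Using the mixture of p and q as the proposal keeps the weights sqrt(p q)/r bounded by 1, so the estimator has finite variance regardless of how far apart the two mixtures are.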

Optimal transport and Rényi informational divergence

Transport-entropy inequalities are considered in terms of Rényi informational divergence.

The Importance Sampling Technique for Understanding Rare Events in Erdős-Rényi Random Graphs

In dense Erdős-Rényi random graphs, we are interested in the events where large numbers of copies of a given subgraph occur. The mean behaviour of subgraph counts is known, and only recently were the related large deviations results discovered. Consequently, it is natural to ask: what is the probability of an Erdős-Rényi graph containing an excessively large number of copies of a given subgraph? Using the large d...

Image Registration and Segmentation by Maximizing the Jensen-Rényi Divergence

Information theoretic measures provide quantitative entropic divergences between two probability distributions or data sets. In this paper, we analyze the theoretical properties of the Jensen-Rényi divergence which is defined between any arbitrary number of probability distributions. Using the theory of majorization, we derive its maximum value, and also some performance upper bounds in terms o...
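Since the snippet is truncated, the definition below is the commonly used formulation of this measure, stated here as an assumption rather than a quotation of the cited work: it builds on the Rényi entropy, with weights ω_i and order α restricted to (0, 1), where the Rényi entropy is concave and the divergence is therefore nonnegative.

```latex
% Renyi entropy of order alpha, 0 < alpha < 1, of a distribution p:
%   R_\alpha(p) = \frac{1}{1-\alpha} \log \sum_x p(x)^{\alpha}
% Jensen-Renyi divergence of p_1, ..., p_n with weights \omega_i \ge 0, \sum_i \omega_i = 1:
\[
  JR_{\alpha}^{\omega}(p_1,\dots,p_n)
    \;=\; R_{\alpha}\!\Big(\sum_{i=1}^{n} \omega_i\, p_i\Big)
          \;-\; \sum_{i=1}^{n} \omega_i\, R_{\alpha}(p_i).
\]
```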

On Rényi Divergence Measures for Continuous Alphabet Sources

The idea of ‘probabilistic distances’ (also called divergences), which in some sense assess how ‘close’ two probability distributions are from one another, has been widely employed in probability, statistics, information theory, and related fields. Of particular importance due to their generality and applicability are the Rényi divergence measures. While the closely related concept of Rényi ent...
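For continuous alphabets, the Rényi divergence of order α studied in this line of work takes the familiar integral form below; the limit α → 1 recovers the Kullback-Leibler divergence.

```latex
% Renyi divergence of order alpha (alpha > 0, alpha != 1) between densities p and q:
\[
  D_{\alpha}(p \,\|\, q)
    \;=\; \frac{1}{\alpha - 1}\,
          \ln \int p(x)^{\alpha}\, q(x)^{1-\alpha}\, dx .
\]
```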

Journal title:

Volume   Issue

Pages  -

Publication date: 2012